A Bayesian information criterion for singular models

Author

  • MATHIAS DRTON
Abstract

We consider approximate Bayesian model choice for model selection problems that involve models whose Fisher-information matrices may fail to be invertible along other competing submodels. Such singular models do not obey the regularity conditions underlying the derivation of Schwarz’s Bayesian information criterion (BIC) and the penalty structure in BIC generally does not reflect the frequentist large-sample behavior of their marginal likelihood. While large-sample theory for the marginal likelihood of singular models has been developed recently, the resulting approximations depend on the true parameter value and lead to a paradox of circular reasoning. Guided by examples such as determining the number of components of mixture models, the number of factors in latent factor models or the rank in reduced-rank regression, we propose a resolution to this paradox and give a practical extension of BIC for singular model selection problems.
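
For orientation, the contrast alluded to above can be stated schematically (these are the standard results, not formulas quoted from the paper). In a regular model with d free parameters, the Laplace approximation behind Schwarz's BIC gives, for a sample y_1, ..., y_n,

    \log m(y_1,\dots,y_n) = \sum_{i=1}^{n} \log p(y_i \mid \hat\theta) - \frac{d}{2}\log n + O_p(1),

whereas singular learning theory yields

    \log m(y_1,\dots,y_n) = \sum_{i=1}^{n} \log p(y_i \mid \theta_0) - \lambda \log n + (m-1)\log\log n + O_p(1),

where \theta_0 is a true parameter and the learning coefficient \lambda (a real log-canonical threshold) and its multiplicity m depend on the unknown true distribution; this dependence is the source of the circularity that the proposed extension of BIC is designed to resolve.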

Similar articles

Bayesian Optimum Design Criterion for Multi Models Discrimination

The problem of obtaining an optimum design that is able to discriminate between several rival models is considered in this paper. We give an optimality criterion using a Bayesian approach. This is an extension of Bayesian KL-optimality to more than two models. A modification is made to deal with nested models. The proposed Bayesian optimality criterion is a weighted average, where...
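
The cut-off sentence describes the criterion as a weighted average of pairwise discrimination measures. As a hedged illustration only (the weights, priors and notation below are assumptions, not taken from the cited paper), such a criterion for rival models f_1, ..., f_k can be built from the pairwise KL-optimality functionals

    I_{ij}(\xi, \theta_i) = \min_{\theta_j} \int_{\mathcal{X}} \mathrm{KL}\big( f_i(\cdot \mid x, \theta_i) \,\big\|\, f_j(\cdot \mid x, \theta_j) \big) \, \xi(dx),

averaged over a prior p_i on \theta_i and over prior model weights \pi_{ij}:

    \Psi(\xi) = \sum_{i \neq j} \pi_{ij} \int I_{ij}(\xi, \theta_i) \, p_i(\theta_i) \, d\theta_i .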

A widely applicable Bayesian information criterion

A statistical model or a learning machine is called regular if the map taking a parameter to a probability distribution is one-to-one and if its Fisher information matrix is always positive definite. If otherwise, it is called singular. In regular statistical models, the Bayes free energy, which is defined by the minus logarithm of Bayes marginal likelihood, can be asymptotically approximated b...
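
The truncated sentence concerns the asymptotic expansion of the Bayes free energy. For reference (standard definitions rather than quotations from the paper), the free energy of a sample y_1, ..., y_n under a prior \varphi is

    F_n = -\log \int \prod_{i=1}^{n} p(y_i \mid \theta) \, \varphi(\theta) \, d\theta,

and in a regular model with d parameters F_n = -\sum_{i} \log p(y_i \mid \hat\theta) + \frac{d}{2}\log n + O_p(1), which is the BIC approximation. The widely applicable BIC (WBIC) proposed in that paper estimates F_n without such regularity by averaging the log-likelihood loss over the posterior tempered at inverse temperature \beta = 1/\log n:

    \mathrm{WBIC} = \mathbb{E}^{\beta}_{\theta}\Big[ -\sum_{i=1}^{n} \log p(y_i \mid \theta) \Big],

where \mathbb{E}^{\beta}_{\theta} denotes expectation with respect to the tempered posterior proportional to \varphi(\theta) \prod_{i} p(y_i \mid \theta)^{\beta}.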

WAIC and WBIC Are Information Criteria for Singular Statistical Model Evaluation

Many statistical models and learning machines that have hierarchical structures, hidden variables, and grammatical rules are not regular but singular statistical models. In singular models, the log-likelihood function cannot be approximated by any quadratic form of the parameter, with the result that conventional information criteria such as AIC, BIC, TIC or DIC cannot be used for model evaluation. ...
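
For concreteness, the widely applicable information criterion named in the title has the following standard form (stated here for reference, not quoted from the paper):

    \mathrm{WAIC} = -\frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_{\theta}\big[ p(y_i \mid \theta) \big] + \frac{1}{n}\sum_{i=1}^{n} \mathbb{V}_{\theta}\big[ \log p(y_i \mid \theta) \big],

where \mathbb{E}_{\theta} and \mathbb{V}_{\theta} are the posterior mean and variance; the first term is the Bayesian training loss and the second, the functional variance, replaces the parameter-count penalty of AIC and remains valid when the posterior is not approximately normal. WBIC, defined above via the tempered posterior at \beta = 1/\log n, plays the corresponding role for marginal-likelihood-based (BIC-type) evaluation.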

Bayesian Melding of Deterministic Models and Kriging for Analysis of Spatially Dependent Data

The link between geographic information systems and decision-making approaches has driven the invention and development of spatial data melding methods. These methods combine different data sets to achieve better results. In this paper, the Bayesian melding method for combining the measurements and outputs of deterministic models and kriging is considered. Then the ozone data in Tehran city are analyze...

Information-based inference in sloppy and singular models

The failure of the information-based Akaike Information Criterion (AIC) in the context of singular models can be rectified by the definition of a Frequentist Information Criterion (FIC). FIC applies a frequentist approximation to the computation of the model complexity, which can be estimated analytically in many contexts. Like AIC, FIC can be understood as an unbiased estimator of the model pr...
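
As background for the truncated comparison with AIC (a standard fact about AIC, not a claim about the cited paper): for a regular model with k parameters,

    \mathrm{AIC} = -2 \sum_{i=1}^{n} \log p(y_i \mid \hat\theta) + 2k

is an asymptotically unbiased estimator of twice the expected predictive log-loss (equivalently, of the expected Kullback–Leibler risk up to a model-independent constant); as the abstract indicates, FIC retains this estimator interpretation but obtains the model complexity from a frequentist approximation that can still be computed when the Fisher information is singular.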

Publication date: 2013